Similar resources
Multi-Timescale Long Short-Term Memory Neural Network for Modelling Sentences and Documents
Neural network based methods have made considerable progress on a variety of natural language processing tasks. However, modelling long texts, such as sentences and documents, remains a challenging task. In this paper, we propose a multi-timescale long short-term memory (MT-LSTM) neural network to model long texts. MT-LSTM partitions the hidden states of the standard LSTM into several groups. Each g...
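The snippet above describes only the grouping idea. As a rough illustration, here is a minimal NumPy sketch in which the hidden units are split into groups and group g is refreshed every 2**g steps; the clockwork-style power-of-two schedule and the class name MultiTimescaleLSTM are assumptions made for illustration, not the paper's actual design.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MultiTimescaleLSTM:
    """Sketch: an LSTM cell whose hidden units are partitioned into
    timescale groups; group g only updates every 2**g steps (assumed
    schedule, for illustration only)."""

    def __init__(self, input_size, hidden_size, num_groups=4, seed=0):
        rng = np.random.default_rng(seed)
        z = input_size + hidden_size
        # One stacked weight matrix for the four LSTM pre-activations (i, f, o, g).
        self.W = rng.normal(0, 0.1, size=(4 * hidden_size, z))
        self.b = np.zeros(4 * hidden_size)
        self.hidden_size = hidden_size
        # Assign each hidden unit to a timescale group.
        self.group = np.arange(hidden_size) % num_groups

    def step(self, x, h, c, t):
        z = np.concatenate([x, h])
        pre = self.W @ z + self.b
        i, f, o, g = np.split(pre, 4)
        c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h_new = sigmoid(o) * np.tanh(c_new)
        # Only groups whose period divides t are refreshed this step;
        # slower groups keep their previous state, capturing longer spans.
        active = (t % (2 ** self.group)) == 0
        return np.where(active, h_new, h), np.where(active, c_new, c)

# Usage: run a toy sequence through the cell.
cell = MultiTimescaleLSTM(input_size=8, hidden_size=16)
h, c = np.zeros(16), np.zeros(16)
for t, x in enumerate(np.random.default_rng(1).normal(size=(20, 8))):
    h, c = cell.step(x, h, c, t)
print(h.shape)  # (16,)
```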
Auditory Short-Term Memory Behaves Like Visual Short-Term Memory
Are the information processing steps that support short-term sensory memory common to all the senses? Systematic, psychophysical comparison requires identical experimental paradigms and comparable stimuli, which can be challenging to obtain across modalities. Participants performed a recognition memory task with auditory and visual stimuli that were comparable in complexity and in their neural ...
The Effects of Keyword and Context Methods on Pronunciation and Receptive/Productive Vocabulary of Low-Intermediate Iranian EFL Learners: Short-Term and Long-Term Memory in Focus
Many studies conducted to date have, in one way or another, confirmed the usefulness of vocabulary-learning strategies in a foreign language. This study examines the effect of two different methods of teaching English vocabulary instruction (keyword and context) on the pronunciation and vocabulary knowledge of low-intermediate Iranian EFL learners, and on how long that knowledge is retained in memory. To this end, sixty Iranian language learners aged eight to fourteen with...
Short-term Associative Memory
One of the simplest associative memories is the Willshaw Network (Willshaw, Buneman & Longuet-Higgins, 1969). Like other associative networks, however (e.g., Hopfield, 1982), it fails completely as a memory device as soon as its capacity is exceeded. Three methods of synaptic change (decay, ageing and depression) are analysed, under which this catastrophic failure can be preempted and stability u...
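As a rough illustration of the overload problem and a decay-style remedy, here is a minimal NumPy sketch of a Willshaw-style binary associative memory. The specific decay rule (randomly erasing a fraction of set synapses before each new storage) and all constants are assumptions for illustration, not the mechanisms analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 256, 16          # units per layer, active bits per pattern
DECAY = 0.05            # fraction of set synapses erased per storage

W = np.zeros((N, N), dtype=bool)  # clipped binary synaptic matrix

def random_pattern():
    p = np.zeros(N, dtype=bool)
    p[rng.choice(N, size=K, replace=False)] = True
    return p

def store(x, y):
    global W
    # Decay (assumed rule): erase a random fraction of set synapses,
    # so old associations fade instead of saturating the matrix.
    on = np.argwhere(W)
    drop = on[rng.random(len(on)) < DECAY]
    W[drop[:, 0], drop[:, 1]] = False
    # Hebbian clipped learning: set synapses between co-active bits.
    W |= np.outer(y, x)

def recall(x):
    # Willshaw recall rule: an output unit fires if all K active
    # inputs project to it through set synapses.
    return (W.astype(int) @ x.astype(int)) >= K

pairs = [(random_pattern(), random_pattern()) for _ in range(50)]
for x, y in pairs:
    store(x, y)
x, y = pairs[-1]  # recently stored pairs should still be retrievable
print((recall(x) == y).all())
```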
Long Short-term Memory
Model compression is important for the wide adoption of Recurrent Neural Networks (RNNs), both on user devices with limited resources and in business clusters that must respond quickly to large-scale service requests. This work aims to learn structurally-sparse Long Short-Term Memory (LSTM) networks by reducing the sizes of the basic structures within LSTM units, including input updates, gates, hidden s...
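As a rough illustration of the structural-sparsity idea mentioned above, here is a minimal NumPy sketch of a group-lasso penalty that groups the weights of each hidden unit across all four LSTM gates, so whole units, rather than individual weights, can be driven toward zero; the function name and the exact grouping are illustrative assumptions, not the paper's method.

```python
import numpy as np

def group_lasso_penalty(W_all, hidden_size):
    """W_all: stacked gate weights of shape (4*hidden_size, input+hidden).
    Groups the rows belonging to the same hidden unit across the four
    gates and returns the sum of per-group L2 norms. Minimising this
    term alongside the task loss encourages entire groups (units) to
    shrink to zero, which is what permits structural pruning."""
    total = 0.0
    for u in range(hidden_size):
        rows = [u + g * hidden_size for g in range(4)]  # unit u in i, f, o, g
        total += np.linalg.norm(W_all[rows])
    return total

rng = np.random.default_rng(0)
W = rng.normal(size=(4 * 32, 32 + 16))  # toy LSTM weights, hidden_size=32
print(group_lasso_penalty(W, 32))
```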
Journal
Journal title: Journal of Verbal Learning and Verbal Behavior
Year: 1968
ISSN: 0022-5371
DOI: 10.1016/s0022-5371(68)80050-7